Photo caption: A picture illustration shows a Facebook logo reflected in a person's eye, March 13, 2015.

Facebook relies on thousands of human moderators to review potentially offensive posts, including videos of death, violence, sexual material, abuse and threatening speech. The moderators have about 10 seconds to decide whether to remove material from the site, according to The Guardian. Facebook told The Guardian that it is difficult to reach consensus on standards for a service with nearly 2 billion users. Earlier this month, Facebook said it was hiring an additional 3,000 people to monitor images on the site.
Source: Bangkok Post May 22, 2017 09:56 UTC